Solving norm constrained portfolio optimization via coordinate-wise descent algorithms

Authors

  • Yu-Min Yen
  • Tso-Jung Yen
Abstract

In this paper we demonstrate that coordinate-wise descent algorithms can be used to solve portfolio selection problems in which asset weights are constrained by lq norms for 1 ≤ q ≤ 2. A special case of such problems arises when q = 1. The l1 norm constraint promotes zero values in the weight vector, leading to automatic asset selection for the portfolio. We first consider the minimum (global) variance portfolio (MVP), in which the asset weights are constrained by weighted l1 and squared l2 norms. We use two benchmark data sets to examine the performance of the norm constrained portfolio. When the sample size is not large in comparison with the number of assets, the norm constrained portfolio tends to have lower out-of-sample portfolio variance, lower turnover rate, fewer active assets and short-sale positions, but a higher Sharpe ratio than the portfolio without such norm constraints. We then present some extensions; in particular, we derive an efficient algorithm for solving an MVP problem in which assets are allowed to be selected in groups. All of the program code for the algorithms is written in R.
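
The update underlying such algorithms is a closed-form univariate minimization applied to one asset weight at a time. As a rough illustration only (not the authors' algorithm), the R sketch below runs coordinate-wise descent on a simplified l1-penalized minimum variance problem, handling the full-investment constraint sum(w) = 1 with a quadratic penalty rather than the exact treatment in the paper; the names cd_mvp_l1, soft_threshold and the penalty parameters are illustrative.

  # Simplified sketch: minimize (1/2) t(w) %*% Sigma %*% w + lambda * sum(abs(w)),
  # with the budget constraint sum(w) = 1 enforced only approximately through
  # a quadratic penalty gamma_budget * (sum(w) - 1)^2 / 2.
  soft_threshold <- function(z, gamma) sign(z) * pmax(abs(z) - gamma, 0)

  cd_mvp_l1 <- function(Sigma, lambda, gamma_budget = 100, max_iter = 1000, tol = 1e-8) {
    p <- ncol(Sigma)
    w <- rep(1 / p, p)                      # start from the equally weighted portfolio
    for (iter in seq_len(max_iter)) {
      w_old <- w
      for (j in seq_len(p)) {
        # coefficients of the univariate quadratic-plus-l1 problem in w[j]
        b_j <- sum(Sigma[j, -j] * w[-j]) + gamma_budget * (sum(w[-j]) - 1)
        a_j <- Sigma[j, j] + gamma_budget
        w[j] <- soft_threshold(-b_j, lambda) / a_j
      }
      if (max(abs(w - w_old)) < tol) break
    }
    w
  }

  # Toy usage: 120 periods of simulated returns on 10 assets
  set.seed(1)
  R_mat <- matrix(rnorm(120 * 10), 120, 10)
  w_hat <- cd_mvp_l1(cov(R_mat), lambda = 0.001)

Larger values of lambda zero out more weights, which is the mechanism behind the automatic asset selection described above.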

Related articles

Primal-Dual methods for sparse constrained matrix completion

We develop scalable algorithms for regular and non-negative matrix completion. In particular, we base the methods on trace-norm regularization that induces a low rank predicted matrix. The regularization problem is solved via a constraint generation method that explicitly maintains a sparse dual and the corresponding low rank primal solution. We provide a new dual block coordinate descent algor...

A New First-order Framework for Orthogonal Constrained Optimization Problems

In this paper, we consider a class of orthogonal constrained optimization problems, the feasible region of which is called the Stiefel manifold. Our new proposed framework combines a function value reduction step with a symmetrization step. Different from the existing approaches, the function value reduction is conducted in the Euclidean space instead of the Stiefel manifold or its tangent spac...

Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms

This paper treats the problem of minimizing a general continuously differentiable function subject to sparsity constraints. We present and analyze several different optimality criteria which are based on the notions of stationarity and coordinate-wise optimality. These conditions are then used to derive three numerical algorithms aimed at finding points satisfying the resulting optimality crite...
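
One simple member of this family of methods is iterative hard thresholding, shown below in R for a sparsity-constrained least-squares objective. This is only an illustration of the general idea, not the specific algorithms or optimality conditions analyzed in the paper, and the names hard_threshold and iht_ls are made up for the sketch.

  # Sketch: minimize (1/2) * sum((y - X %*% w)^2) subject to at most s nonzero entries in w.
  hard_threshold <- function(w, s) {
    keep <- order(abs(w), decreasing = TRUE)[seq_len(s)]   # indices of the s largest |w_j|
    out <- numeric(length(w)); out[keep] <- w[keep]
    out
  }

  iht_ls <- function(X, y, s, max_iter = 500, tol = 1e-8) {
    L <- max(eigen(crossprod(X), symmetric = TRUE, only.values = TRUE)$values)  # Lipschitz constant
    w <- numeric(ncol(X))
    for (iter in seq_len(max_iter)) {
      grad  <- as.vector(crossprod(X, X %*% w - y))   # gradient of the quadratic loss
      w_new <- hard_threshold(w - grad / L, s)        # gradient step, then keep s largest entries
      if (max(abs(w_new - w)) < tol) { w <- w_new; break }
      w <- w_new
    }
    w
  }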

A Coordinate Majorization Descent Algorithm for l1 Penalized Learning

The glmnet package by [1] is an extremely fast implementation of the standard coordinate descent algorithm for solving l1 penalized learning problems. In this paper, we consider a family of coordinate majorization descent algorithms for solving the l1 penalized learning problems by replacing each coordinate descent step with a coordinate-wise majorization descent operation. Numerical experiment...
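
To make the idea concrete, the R sketch below applies a coordinate-wise majorization descent step to l1-penalized logistic regression: instead of solving each coordinate subproblem exactly, it minimizes a quadratic majorizer whose curvature bound M_j = sum(X[, j]^2) / (4 * n) uses the fact that the logistic second derivative is at most 1/4. This is only an illustrative sketch, not the glmnet implementation, and the names cmd_logistic_l1 and soft_threshold are made up here.

  soft_threshold <- function(z, gamma) sign(z) * pmax(abs(z) - gamma, 0)

  cmd_logistic_l1 <- function(X, y, lambda, max_iter = 200, tol = 1e-7) {
    # y is coded as -1 / +1; objective: mean(log(1 + exp(-y * X %*% w))) + lambda * sum(abs(w))
    n <- nrow(X); p <- ncol(X)
    w <- numeric(p)
    M <- colSums(X^2) / (4 * n)            # coordinate-wise curvature bounds
    eta <- as.vector(X %*% w)              # linear predictor, kept in sync with w
    for (iter in seq_len(max_iter)) {
      w_old <- w
      for (j in seq_len(p)) {
        prob <- 1 / (1 + exp(y * eta))     # derivative weights of the logistic loss
        g_j  <- -sum(y * X[, j] * prob) / n
        w_new <- soft_threshold(w[j] - g_j / M[j], lambda / M[j])
        eta   <- eta + X[, j] * (w_new - w[j])
        w[j]  <- w_new
      }
      if (max(abs(w - w_old)) < tol) break
    }
    w
  }

  # Toy usage
  set.seed(1)
  X <- matrix(rnorm(200 * 15), 200, 15)
  y <- sign(X[, 1] - X[, 2] + rnorm(200))
  beta_hat <- cmd_logistic_l1(X, y, lambda = 0.02)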

Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization

In this paper we analyze several new methods for solving nonconvex optimization problems with the objective function formed as a sum of two terms: one is nonconvex and smooth, and another is convex but simple and its structure is known. Further, we consider both cases: unconstrained and linearly constrained nonconvex problems. For optimization problems of the above structure, we propose random ...
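
The basic update in such methods is a proximal step along one randomly chosen coordinate. The R sketch below shows this for a composite objective g(w) + lambda * sum(abs(w)), where g is smooth with coordinate-wise Lipschitz gradient constants L; the same update applies whether g is convex or nonconvex, although the convergence analysis differs. The interface (random_cd, grad_g, L) is illustrative and does not reproduce the paper's specific schemes or step-size rules.

  soft_threshold <- function(z, gamma) sign(z) * pmax(abs(z) - gamma, 0)

  random_cd <- function(grad_g, L, lambda, w0, n_updates = 5000) {
    w <- w0
    p <- length(w)
    for (k in seq_len(n_updates)) {
      j <- sample.int(p, 1)                # pick one coordinate uniformly at random
      g_j <- grad_g(w)[j]                  # j-th partial derivative (full gradient used for clarity)
      w[j] <- soft_threshold(w[j] - g_j / L[j], lambda / L[j])   # proximal coordinate step
    }
    w
  }

  # Toy usage with a smooth quadratic g (the update is identical for nonconvex smooth g)
  set.seed(1)
  X <- matrix(rnorm(100 * 20), 100, 20); y <- rnorm(100)
  grad_g <- function(w) as.vector(crossprod(X, X %*% w - y)) / 100
  L <- colSums(X^2) / 100
  w_hat <- random_cd(grad_g, L, lambda = 0.05, w0 = numeric(20))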


Journal:
  • Computational Statistics & Data Analysis

Volume 76, Issue -

Pages -

Publication date 2014